# 2048 context length

Nomic Xlm 2048
A fine-tune of the XLM-RoBERTa base model that replaces the original learned positional embeddings with RoPE (Rotary Position Embedding), supporting sequence lengths up to 2048 (see the RoPE sketch below).
Large Language Model · Transformers
nomic-ai
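
For context, RoPE encodes token position by rotating pairs of query/key channels through position-dependent angles rather than adding learned absolute position embeddings, which is what lets this fine-tune extend XLM-RoBERTa's usual 512-token window to 2048. Below is a minimal NumPy sketch of the rotation using the half-split (GPT-NeoX-style) convention; the function name and the base of 10000 are illustrative assumptions, not taken from the model's actual code.

```python
import numpy as np

def apply_rope(x: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Rotate channel pairs of x (shape: seq_len x dim, dim even) by
    position-dependent angles, encoding position without learned embeddings.
    Illustrative sketch only; base and pairing convention are assumptions."""
    seq_len, dim = x.shape
    half = dim // 2
    # One frequency per channel pair; lower-indexed pairs rotate faster.
    inv_freq = 1.0 / (base ** (np.arange(half) / half))
    # Angle for every (position, pair) combination.
    angles = np.outer(np.arange(seq_len), inv_freq)  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2-D rotation applied to each (x1, x2) channel pair.
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Example: rotate random query vectors for a full 2048-token sequence.
q = np.random.randn(2048, 64)
q_rot = apply_rope(q)
print(q_rot.shape)  # (2048, 64)
```

Because the rotation depends only on relative position differences after the query-key dot product, no embedding table has to be resized to run at longer sequence lengths.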